I’m currently testing different SEO tactics on a new site I built as part of a contest where I work. We were all given the same keyword to try to rank for, and at the end of 60 days, whoever holds the first position in Google’s organic SERPs is the winner. The only stipulations were that we had to buy a new domain rather than use an existing one, and we couldn’t put any form of the keyword into the domain name. Oh yeah, and we couldn’t spend more than $50 outside of domain registration and web hosting.
My goal was to use only white hat tactics, and it was all going great guns for the first couple of weeks. Of course, that’s because my site held the #1 and #2 positions. Then my web hosting company ate my homework, and I didn’t know it for several days. My site started slipping in the SERPs, my home page dropped out of the results entirely, and other pages from the site showed up in its place.
My first thought was that I had let several days go by without adding fresh content, and my competitors had painted a big target on my back.
I have since discovered that something happened with my web host: the server my site sits on is now serving up my pages at a rate somewhat slower than pond water. Gah! If you ever wondered whether slow server performance could really hurt a site’s rankings, I have the proof. Today only one page from my site remains in Google’s index, and it’s in 7th position.
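If you suspect your host has slowed down the same way, it only takes a few lines to measure it. Here’s a minimal sketch in Python; example.com is a placeholder, not my actual contest domain:

```python
import time
import urllib.request

# Hypothetical URL standing in for the contest site's home page.
URL = "http://example.com/"

# Time a few requests. A healthy shared host answers in well under a
# second; a server that routinely takes several seconds (or times out)
# is the kind crawlers give up on.
for attempt in range(1, 4):
    start = time.time()
    try:
        with urllib.request.urlopen(URL, timeout=10) as resp:
            print(f"attempt {attempt}: HTTP {resp.status} in {time.time() - start:.2f}s")
    except Exception as exc:
        print(f"attempt {attempt}: failed after {time.time() - start:.2f}s ({exc})")
```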
However, when I checked my Dashboard, rather than telling me they dropped my pages from the index because my web server sucks, Google said they couldn’t reach my robots.txt. That’s probably because I didn’t have one, though a missing robots.txt normally just returns a 404, which Google reads as permission to crawl everything.
Instead, Google says my robots.txt file was unreachable, so they’re not going to crawl my site. Wait, what? My server was apparently too slow to cough up even a clean 404, and rather than risk crawling pages I might have meant to block, Google simply went away. And when did “roboted out” become a verb?
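For anyone who hits the same message, it’s easy to check what your server actually returns for robots.txt. Here’s a quick sketch in Python (again, example.com is a stand-in for your own domain):

```python
import urllib.error
import urllib.request

# Hypothetical domain; swap in your own.
ROBOTS_URL = "http://example.com/robots.txt"

try:
    with urllib.request.urlopen(ROBOTS_URL, timeout=10) as resp:
        # 200: the file exists, and crawlers will obey whatever is in it.
        print(f"HTTP {resp.status}")
        print(resp.read().decode("utf-8", errors="replace"))
except urllib.error.HTTPError as err:
    # A clean 404 is harmless: Google treats it as "no restrictions."
    print(f"HTTP {err.code} (a 404 here is fine)")
except urllib.error.URLError as err:
    # A timeout or connection failure is the dangerous case: Google
    # marks robots.txt "unreachable" and holds off crawling the site.
    print(f"unreachable: {err.reason}")
except OSError as exc:
    # Raw socket timeouts can slip past URLError on some Python versions.
    print(f"unreachable: {exc}")
```

For what it’s worth, a trivial allow-all robots.txt (just `User-agent: *` followed by an empty `Disallow:`) guarantees crawlers a fast 200 instead of leaving them to interpret your server’s failures.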
I’m moving the site to a new host today, but this web hosting company has probably cost me the contest!